Overview
What is Informatica PowerCenter?
Informatica PowerCenter is a metadata-driven data integration technology designed to form the foundation for data integration initiatives, including analytics and data warehousing, application migration, consolidation, and data governance.
Popular Features
- Connect to traditional data sources: 9.0 (18 ratings)
- Business rules and workflow: 9.0 (18 ratings)
- Simple transformations: 8.0 (18 ratings)
- Complex transformations: 7.0 (18 ratings)
Pricing
Entry-level set up fee?
- No setup fee
Offerings
- Free Trial
- Free/Freemium Version
- Premium Consulting/Integration Services
Alternatives Pricing
What is Clear Analytics?
Clear Analytics is a business intelligence solution that enables non-technical end users to perform analytics by leveraging existing knowledge of Excel coupled with a built-in query builder. Some key features include: Dynamic Data Refresh, Data Share, and In-Excel Collaboration.
What is Vertify?
VertifyData is a cloud-based integration platform with core integration capabilities, including a drag-and-drop interface and real-time synchronization. It also offers over 80 prebuilt connectors and templates, plus customizable integrations for scaling businesses.
Product Demos
Informatica Source Qualifier Online Tutorial for Beginners - Part 14
Informatica Client Tools Training Session for Beginners - Part 4 (Video Course)
Informatica Online Training Session | Free Tutorials | Materials | ETL Tool
Features
Data Source Connection
Ability to connect to multiple data sources
- Connect to traditional data sources: 9 (18 ratings)
Ability to connect to traditional data sources like relational databases, flat files, XML files and packaged applications
- Connect to Big Data and NoSQL: 8 (14 ratings)
Ability to connect to non-traditional data sources like Hadoop and other big data technologies, and NoSQL databases
Data Transformations
Data transformations include calculations, search and replace, data normalization and data parsing
- Simple transformations: 8 (18 ratings)
Simple data transformations are calculations, data type conversions, aggregations and search and replace operations
- Complex transformations: 7 (18 ratings)
Complex data transformations are data normalization, advanced data parsing, etc.
Data Modeling
A data model is a diagram or flowchart that illustrates the relationships between data
- Data model creation: 9 (15 ratings)
Ability to create and maintain data models using a graphical tool to define relationships between data
- Metadata management: 8 (16 ratings)
Automated discovery of metadata with ability to synchronize and share metadata with other tools like Master Data Management
- Business rules and workflow: 9 (18 ratings)
Ability to define and manage business rules and workflows
- Collaboration: 6.1 (16 ratings)
Collaboration is enabled by a shared repository of project information and metadata
- Testing and debugging: 9 (17 ratings)
Tool to debug and tune for optimal performance
Data Governance
Data governance is the practice of implementing policies defining effective use of an organization's data assets
- Integration with data quality tools: 9 (15 ratings)
Integration with tools for cleansing, parsing and normalizing data according to business rules
- Integration with MDM tools: 9 (13 ratings)
Integration with master data management tools to ensure data consistency across the organization
Product Details
- About
- Tech Details
- FAQs
Informatica PowerCenter Technical Details
| Operating Systems | Unspecified |
| --- | --- |
| Mobile Application | No |
Reviews and Ratings (92)
Community Insights
- Business Problems Solved
- Pros
- Cons
- Recommendations
PowerCenter is widely used across organizations for a variety of use cases. Users have found the product to be instrumental in addressing their data integration needs and solving challenges related to data quality management, master data management, data masking, and data virtualization. With PowerCenter, users can easily extract, transform, and load data from various sources such as SAP and Salesforce into their data warehouses and data marts. They appreciate how the product optimizes data for reporting tools and facilitates the generation of accurate reports.
Another common use case for PowerCenter is integrating data from disparate sources and migrating data from legacy systems to newer infrastructure. Users have found the product to be efficient in automating ETL processes and ensuring timely and efficient data loads. They also value its advanced security features, which make administration easier. Many users have praised PowerCenter's ease of use, fast development capabilities, and multi-user development environment with a check-out/check-in mechanism that supports collaborative work.
Overall, PowerCenter has become a mainstay ETL tool in organizations due to its scalability, flexibility, and robustness as an ETL engine. From loading data into data warehouses and data marts to feeding information into COTS applications or Teradata data warehouses, PowerCenter has proven to be an indispensable tool for enterprise-wide data integration across a range of industries, including marketing, healthcare, finance, and more.
Seamless data integration capabilities: Multiple reviewers have praised Informatica PowerCenter for its seamless data integration capabilities, with users stating that it effortlessly connects to multiple sources and targets. This feature simplifies the complex process of integrating data from various systems, providing a significant advantage in terms of efficiency and productivity.
User-friendly interface: The user-friendly interface of Informatica PowerCenter has been consistently appreciated by reviewers, who find it highly intuitive and easy to navigate. Users state that the application of complicated business rule logic is straightforward, thanks to the intuitive design process for creating ETL mappings and workflows. The minimal learning time and effort required to work with PowerCenter's interface are seen as significant advantages.
Reusability and streamlined development processes: Reviewers have highlighted PowerCenter's ability to capture and share logic in mapplets, enabling reusability and streamlining development processes. This feature contributes to a seamless workflow for developers, making the development and maintenance of code in PowerCenter straightforward. Users value this functionality as it enhances efficiency and promotes consistency across projects.
Lack of Documentation: Several users have expressed frustration with the limited availability and insufficient documentation for Informatica PowerCenter. This has made it challenging for users to achieve advanced tasks and work with complex workflows, hindering their ability to fully utilize the software.
Complex Installation and Configuration: Many reviewers have encountered difficulties during the installation and configuration process of Informatica PowerCenter. The software consists of multiple components that require significant time and resources to set up effectively, causing inconvenience for users.
Limited Integration Capabilities: Some users have found it difficult to integrate code from other languages like Java, Python, or R into Informatica PowerCenter. This limitation forces users to build multiple hops in a data pipeline, resulting in additional complexity and potential inefficiencies in their workflow.
Users recommend Informatica PowerCenter for various tasks such as data migration, ETL, and data integration. They find it to be a reliable tool with strong support and powerful mapping and visualization capabilities. Additionally, users mention its ability to connect and fetch data from different source systems, as well as its capabilities for processing and transforming data. Informatica PowerCenter is highly recommended for companies dealing with large amounts of data and those in need of real-time data integration. Users also mention its plugins for integrating with other applications. While some users acknowledge that Informatica PowerCenter is more expensive than other software, they still recommend it as the best ETL tool in the market.
Attribute Ratings
Reviews (1-7 of 7)
Amazing ETL Tool
- Extracting, transforming and loading data to and from different databases.
- Development, testing and maintenance of code is fairly simple.
- Good performance even when processing large amounts of data.
- Amazing ability to connect with different types of sources and targets available in the market.
- Performance with ODBC drivers is comparatively slow, which adds a lot of run time.
- Deployment is a bit complex.
- CLOB and BLOB data types are very difficult to handle.
A powerful ETL solution which focuses on enterprise scalability, flexibility, and code re-usability
- Enforces enterprise-wide ETL development standards.
- Provides code re-usability with shared connections and objects.
- Particularly adept at integrating a wide range of disparate data sources (handles flat files particularly well).
- Well suited for moving large amounts of data.
- There are too many ways to perform the same or similar functions, which makes it challenging to trace what a workflow is doing and at which point (e.g., sessions can be designed as static or reusable, and overrides can occur at the session, the workflow, or both, which can be counterproductive and confusing when troubleshooting).
- The power in structured design is a double-edged sword, and simple tasks for a POC can become cumbersome. For example, if you want to move some data to test a process, you first have to create your sources by importing them, which means an ODBC connection or similar must be configured; you then have to develop your targets and all of the essential building blocks before actual development can begin. While I am on sources and targets, I think of a table definition as just that, and find it counterintuitive to have to design a table as both a source and a target and manage them as different objects. It would be more intuitive to have a single table definition, with its source/target properties determined by where you drag and drop it in the mapping.
- There are no checkpoint or data-viewer type functions without designing an entire mapping and workflow. If you would like to simply run a job up to a point and check the throughput, an entire mapping needs to be completed, and the usual workaround is to create a flat-file target.
For small projects, or smaller development teams with mostly a single data source, expect frustration with quickly testing a solution, as the design flow is very structured. The tool is also designed so that segregation of duties at a very high level can make small development teams counterproductive. Each step in the design process lives in a separate application, and although the applications are stitched together, the experience is not without its problems. To design a simple mapping, for example, you first need a connection established to the source (e.g., ODBC), keeping in mind that the container is automatically named after your connection. You then open the Designer tool, import a connection as a source, optionally check it in, create a target, optionally check it in as well, and design a transformation mapping. To test or run it, you need to open a separate application (Workflow Manager), create a session for your mapping, and create a workflow for those one or more sessions, at which point you can test it. After running it, in order to observe it, you then need to open yet another application (Monitor) to see what it is doing and how well. For a developer coming from something like SSIS, this can be daunting and cumbersome when building a simple POC and trying to test it (although, conversely, building an enterprise-scalable ETL solution from SSIS is its own challenge).
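Once the mapping, session, and workflow described above exist, the run-and-observe steps do not strictly require the GUI: PowerCenter ships a `pmcmd` command-line utility that can start a workflow and query its run details. A minimal sketch follows; every name in it (integration service, domain, folder, workflow) is a made-up placeholder, not something from this review.

```shell
# Hypothetical sketch of running a PowerCenter workflow from the command line.
# All names below are placeholder assumptions.
INT_SERVICE="IS_dev"        # integration service name (assumption)
INFA_DOMAIN="Domain_dev"    # Informatica domain name (assumption)
FOLDER="DW_Loads"           # repository folder (assumption)
WORKFLOW="wf_daily_load"    # workflow built in Workflow Manager (assumption)

# Start the workflow and block until it finishes; pmcmd exits non-zero on failure.
# Credentials are taken from the INFA_USER/INFA_PASS environment variables.
pmcmd startworkflow -sv "$INT_SERVICE" -d "$INFA_DOMAIN" \
    -uv INFA_USER -pv INFA_PASS \
    -f "$FOLDER" -wait "$WORKFLOW"

# Query run status afterwards instead of opening the Monitor application.
pmcmd getworkflowdetails -sv "$INT_SERVICE" -d "$INFA_DOMAIN" \
    -uv INFA_USER -pv INFA_PASS \
    -f "$FOLDER" "$WORKFLOW"
```

In practice this is how teams script scheduled loads; it does not remove the need for the Designer and Workflow Manager applications to build the objects in the first place.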
Good ETL Platform
- It is quite flexible in handling different types of data formats, like DB objects or tables
- Good performance when processing lots of data in batch
- Easy to learn and use
- The client is quite heavy, in terms of both size and function
- The upgrade process could be better
- Test and deployment automation needs to be improved
PowerCenter works well with large, structured data files
- PowerCenter processes input files, performs specified transformations, and maps the input data format to the output data format very quickly. The PowerCenter backend implementation seems to be optimized to process and map structured input records to structured output records and load the records into a database. One of the strengths of PowerCenter is performance of processing petabytes of structured input data files.
- PowerCenter does not require a software development experience or education. After providing initial hands-on training, the data consultants (who are statisticians, subject matter experts) in our organization were able to implement data ingest and data transformation tasks fairly easily.
- PowerCenter supports multiple DBMS technologies (for example, Oracle, Netezza). This flexibility allows it to be used by multiple departments within our organization.
- One of the challenges of PowerCenter is the lack of integration between the components and functionality it provides. PowerCenter consists of multiple components, such as the repository service, integration service, and metadata service. Considerable time and resources were required to install and configure these components before PowerCenter was available for use.
- In order to connect to various data sources, such as a Netezza database or SAS datasets, PowerCenter requires the installation and configuration of separate plug-ins. We spent considerable time troubleshooting and debugging problems while trying to get the various plug-ins integrated with PowerCenter and up and running as described in the documentation.
- PowerCenter works well with structured data. That is, it is easy to work with input and output data that is pre-defined, fixed, and unchanging. It is much more difficult to work with dynamic data in which new fields are added or removed ad-hoc or if data format changes during the data ingest process. We have not been as successful in using PowerCenter for dynamic data.
- One of the challenges of learning PowerCenter is that it is difficult to find documentation or publications that help you learn the various details about PowerCenter software. Unlike SAS Institute, Informatica does not publish books about PowerCenter. The documentation available with PowerCenter is sparse; we have learned many aspects of this technology through trial and error.
PowerCenter is well suited for processing of large amounts of data that is structured and pre-defined. It is well-suited for large organizations that have the resources to install, configure and support PowerCenter. It is well suited for large organizations that have a large number of data consultants/analysts that do not have a software development/programming background.
PowerCenter is not a good fit for smaller, agile organizations that work with unstructured data and changing/dynamic data.
My personal view of Informatica Enterprise Data Integration tools based on my 10+ years of user experiences.
- Informatica PowerCenter tools are very effective at extracting, transforming, and loading bulk data, such as hundreds of millions of database rows, raw flat files, XML files, etc., out of or into all kinds of platforms: databases (Oracle, DB2, SQL Server, MySQL, virtually any database you can name), operating systems (UNIX, Linux, Windows), web services, and real-time sources such as IBM MQ.
- A single toolset to be used for all kinds of platforms, for bulk data loads, for real-time loads, or web services.
- The Informatica web service interface could be improved from both a performance and a troubleshooting point of view.
PowerCenter for ETL
- Data migration from multiple sources. It handles any type of source data, including RDBMS, flat files, XML, mainframe, etc.
- Implementing data migration rules is very easy, efficient, and reusable.
- Development and maintenance of code is very easy. Design, development, and scheduling of full and incremental loads are straightforward.
- Provides a lot of features for developers to implement any kind of business rule.
- Not worth using PowerCenter for simple data migrations.
- Licensing cost.
- Handling BLOB and CLOB data types.
One stop for all ETL needs
- Ability to work with different types of sources and targets.
- Version control of the code.
- Ability to integrate with LDAP for security.
- Ease of use.
- Performance with ODBC drivers can be improved.
- Memory utilization is very high during ETL execution.
- The built-in scheduler's capabilities could be improved.